
Add :id to fog server extensions #192

Merged
stejskalleos merged 1 commit into theforeman:master from stejskalleos:ls/id
Mar 3, 2026

Conversation

@stejskalleos
Contributor

Before

hammer compute-resource virtual-machines --id 2

-----|--------
NAME | STATE
-----|--------
vm1  | stopped
vm2  | running
-----|--------

After

hammer compute-resource virtual-machines --id 2

-----|------|--------
ID   | NAME | STATE
-----|------|--------
vm1  | vm1  | stopped
vm2  | vm2  | running
-----|------|--------
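A minimal sketch of the effect (hypothetical plain Ruby, not the actual PR diff; `vm_attrs` is an illustrative name): the extra ID column appears once each VM's serialized hash carries an :id key, and since a KubeVirt VM is identified by its name, id mirrors name in the table above.

```ruby
# Illustrative only: sketches why the listing gains an ID column.
# A KubeVirt VM has no separate numeric id, so :id mirrors :name.
def vm_attrs(vm)
  {
    id: vm[:name],   # the key the PR adds to the serialized attributes
    name: vm[:name],
    state: vm[:state]
  }
end

vm_attrs(name: 'vm1', state: 'stopped')
# returns a hash with :id, :name and :state keys
```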

cc @evgeni as the author of the fix

@evgeni
Member

evgeni commented Mar 2, 2026

feels odd to ack that now…

@evgeni
Member

evgeni left a comment

I'll still do it :)

Tested with rubygem-foreman_kubevirt-0.5.2-1.20260302114743781949.pr192.6.g78944a9.el9.noarch

@evgeni
Member

evgeni commented Mar 3, 2026

This might be unrelated (or not), but calling virtual-machine info on these IDs doesn't work:

[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id centos-stream9-amd64-gray-jay-39
[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id rufus-vargas.example.com
404 Not Found
[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id centos-stream9-amd64-gray-jay-39
[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id jose-alnas.example.com
404 Not Found

Interestingly, the error is different than for a really wrong ID:

[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id sdfsdf
ERF42-6171 [Foreman::Exception]: Virtual machine was not found by id sdfsdf

@evgeni
Member

evgeni commented Mar 3, 2026

When doing --debug, I see:

[ INFO 2026-03-03T02:09:01 API] GET /api/compute_resources/1/available_virtual_machines/centos-stream9-amd64-gray-jay-39
…
[DEBUG 2026-03-03T02:09:04 API] Response: {
           "namespace" => "evgeni-cnv",
                "name" => "centos-stream9-amd64-gray-jay-39",
    "resource_version" => "11544680609",
                 "uid" => "041f7ea0-d0b2-4550-a635-99f67af3ee71",
              "labels" => {
                                            "app" => "centos-stream9-amd64-gray-jay-39",
        "kubevirt.io/dynamic-credentials-support" => "true",
                        "vm.kubevirt.io/template" => "centos-stream9-server-small",
              "vm.kubevirt.io/template.namespace" => "openshift",
               "vm.kubevirt.io/template.revision" => "1",
                "vm.kubevirt.io/template.version" => "v0.34.1"
    },
               "disks" => [
        [0] {
                  "name" => "rootdisk",
            "boot_order" => nil,
                  "type" => "disk",
                   "bus" => "virtio",
              "readonly" => nil
        },
        [1] {
                  "name" => "cloudinitdisk",
            "boot_order" => nil,
                  "type" => "disk",
                   "bus" => "virtio",
              "readonly" => nil
        }
    ],
             "volumes" => [
        [0] {
                  "name" => "rootdisk",
                  "type" => "dataVolume",
                  "info" => "centos-stream9-amd64-gray-jay-39",
                "config" => {
                "name" => "centos-stream9-amd64-gray-jay-39"
            },
            "boot_order" => nil,
                   "bus" => "virtio"
        },
…

and

[ INFO 2026-03-03T02:10:50 API] GET /api/compute_resources/1/available_virtual_machines/jose-alnas.example.com
…
[ERROR 2026-03-03T02:10:50 API] 404 Not Found
[DEBUG 2026-03-03T02:10:50 API] {
    "status" => 404,
     "error" => "Not Found"
}

Quite sure it's not the fault of this PR, but will investigate

@evgeni
Member

evgeni commented Mar 3, 2026

Ooooh, the 404 is because

 5436a649 | ActionController::RoutingError (No route matches [GET] "/api/compute_resources/1/available_virtual_machines/jose-alnas.example.com"):

Let's fix that! theforeman/foreman#10889
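The routing failure can be demonstrated in plain Ruby (this reflects my reading of Rails routing, not the actual patch in theforeman/foreman#10889): Rails treats "/", "." and "?" as segment separators, so a dynamic :id segment matches /[^.\/?]+/ by default and a dotted FQDN is cut off at the first dot, leaving no matching route.

```ruby
# What a dynamic :id segment matches out of the box in Rails routing:
DEFAULT_ID = /[^.\/?]+/
# A typical loosening, e.g. via constraints: { id: /[^\/]+/ }:
RELAXED_ID = /[^\/]+/

fqdn = 'jose-alnas.example.com'
DEFAULT_ID.match(fqdn)[0]   # "jose-alnas" only: the dot ends the segment, hence 404
RELAXED_ID.match(fqdn)[0]   # "jose-alnas.example.com": the full FQDN matches
```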

@evgeni
Member

evgeni commented Mar 3, 2026

Huh, I thought it'd get better once I installed rubygem-hammer_cli_foreman_kubevirt, but:

[ERROR 2026-03-03T02:34:01 Exception] Error: undefined method `provider_vm_specific_fields' for #<HammerCLIForemanKubevirt::ComputeResources::Kubevirt:0x00007f49db6beb40>
Did you mean?  provider_specific_fields
Error: undefined method `provider_vm_specific_fields' for #<HammerCLIForemanKubevirt::ComputeResources::Kubevirt:0x00007f49db6beb40>
Did you mean?  provider_specific_fields
[ERROR 2026-03-03T02:34:01 Exception] 

NoMethodError (undefined method `provider_vm_specific_fields' for #<HammerCLIForemanKubevirt::ComputeResources::Kubevirt:0x00007f49db6beb40>
Did you mean?  provider_specific_fields):
    /usr/share/gems/gems/hammer_cli_foreman-3.19.0.pre.develop/lib/hammer_cli_foreman/virtual_machine.rb:30:in `print_data'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/lib/hammer_cli/apipie/command.rb:35:in `execute'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/command.rb:66:in `run'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/lib/hammer_cli/abstract.rb:103:in `run'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/subcommand/execution.rb:18:in `execute'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/command.rb:66:in `run'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/lib/hammer_cli/abstract.rb:103:in `run'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/subcommand/execution.rb:18:in `execute'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/command.rb:66:in `run'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/lib/hammer_cli/abstract.rb:103:in `run'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/subcommand/execution.rb:18:in `execute'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/command.rb:66:in `run'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/lib/hammer_cli/abstract.rb:103:in `run'
    /usr/share/gems/gems/clamp-1.4.0/lib/clamp/command.rb:140:in `run'
    /usr/share/gems/gems/hammer_cli-3.19.0.pre.develop/bin/hammer:142:in `<top (required)>'
    /usr/bin/hammer:25:in `load'
    /usr/bin/hammer:25:in `<main>'

@evgeni
Member

evgeni commented Mar 3, 2026

The empty print without hammer_cli_kubevirt is fixed by theforeman/hammer-cli-foreman#649

@evgeni
Member

evgeni commented Mar 3, 2026

The crash with kubevirt is fixed by theforeman/hammer-cli-foreman#650

@evgeni
Member

evgeni commented Mar 3, 2026

However, even after all the fixes, the id is not displayed in the hammer … info output:

[root@ip-10-0-167-89 ~]# hammer compute-resource virtual-machine info --id 1 --vm-id jose-alnas.example.com
Name: jose-alnas.example.com

@evgeni
Member

evgeni commented Mar 3, 2026

Okay, I think that is because id is not declared as an attribute of the Fog model. See e.g. the libvirt compute:
https://github.com/fog/fog-libvirt/blob/40bd58484a461b5298b563211b9d629852ae9e8b/lib/fog/libvirt/models/compute/server.rb#L12
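A simplified plain-Ruby illustration of the pattern the linked fog-libvirt file uses (no fog dependency; `MiniModel` and `Server` are made-up names, not foreman_kubevirt code): only fields declared through a class-level attribute macro are exposed, so an :id present in the raw payload but never declared on the model silently disappears from the printed output.

```ruby
# Miniature version of a Fog-style attribute DSL.
class MiniModel
  def self.attributes
    @attributes ||= []
  end

  # Class macro: registers the field and defines a reader for it.
  def self.attribute(name)
    attributes << name
    define_method(name) { @data[name] }
  end

  def initialize(data)
    @data = data
  end

  # Only declared attributes make it into the serialized hash,
  # mirroring how a Fog model builds its visible fields.
  def to_h
    self.class.attributes.to_h { |a| [a, @data[a]] }
  end
end

class Server < MiniModel
  attribute :name
  # attribute :id   # without a declaration like this, :id never surfaces
end

Server.new(id: 'vm1', name: 'vm1').to_h   # :id is silently dropped
```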

@evgeni
Member

evgeni commented Mar 3, 2026

fwiw, I think it's fine to merge as-is and fix up the rest of hammer/api output later

@stejskalleos
Contributor Author

fwiw, I think it's fine to merge as-is and fix up the rest of hammer/api output later

Yop, I'll review them next.

@stejskalleos stejskalleos merged commit 620e5df into theforeman:master Mar 3, 2026
9 checks passed