bug: fix order of spinkube params and url #5896
Open
Azure Pipelines / Agentbaker E2E
failed
Feb 22, 2025 in 20m 17s
Build #20250222.2 had test failures
Details
- Failed: 3 (4.11%)
- Passed: 70 (95.89%)
- Other: 0 (0.00%)
- Total: 73
Annotations
Check failure on line 10914 in Build log
azure-pipelines / Agentbaker E2E
Build log #L10914
Bash exited with code '1'.
Check failure on line 1 in Test_AzureLinuxV2_CustomSysctls
azure-pipelines / Agentbaker E2E
Test_AzureLinuxV2_CustomSysctls
Failed
Raw output
scenario_helpers_test.go:188: VHD: "/subscriptions/c4c3550e-a965-4993-a50c-628fd38cd3e1/resourceGroups/aksvhdtestbuildrg/providers/Microsoft.Compute/galleries/PackerSigGalleryEastUS/images/AzureLinuxV2gen2/versions/1.1740128692.26531", TAGS {Name:Test_AzureLinuxV2_CustomSysctls ImageName:AzureLinuxV2gen2 OS:azurelinux Arch:amd64 Airgap:false NonAnonymousACR:false GPU:false WASM:false ServerTLSBootstrapping:false KubeletCustomConfig:false}
vmss.go:39: creating VMSS "h3d1-2025-02-22-azurelinuxv2customsysctls" in resource group "MC_abe2e-westus3_abe2e-kubenet-331fc_westus3"
azure.go:497: creating VMSS h3d1-2025-02-22-azurelinuxv2customsysctls in resource group MC_abe2e-westus3_abe2e-kubenet-331fc_westus3
azure.go:510: created VMSS h3d1-2025-02-22-azurelinuxv2customsysctls in resource group MC_abe2e-westus3_abe2e-kubenet-331fc_westus3
scenario_helpers_test.go:146: vmss h3d1-2025-02-22-azurelinuxv2customsysctls creation succeeded
kube.go:147: waiting for node h3d1-2025-02-22-azurelinuxv2customsysctls to be ready
kube.go:168: node h3d1-2025-02-22-azurelinuxv2customsysctls000000 is tainted. Taints: [{"key":"node.kubernetes.io/network-unavailable","effect":"NoSchedule","timeAdded":"2025-02-22T01:56:46Z"}] Conditions: [{"type":"NetworkUnavailable","status":"True","lastHeartbeatTime":"2025-02-22T01:56:46Z","lastTransitionTime":"2025-02-22T01:56:46Z","reason":"NodeInitialization","message":"Waiting for cloud routes"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2025-02-22T01:56:36Z","lastTransitionTime":"2025-02-22T01:56:35Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2025-02-22T01:56:36Z","lastTransitionTime":"2025-02-22T01:56:35Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"PIDPressure","status":"False","lastHeartbeatTime":"2025-02-22T01:56:36Z","lastTransitionTime":"2025-02-22T01:56:35Z","reason":"KubeletHasSufficientPID","message":"kubelet has sufficient PID available"},{"type":"Ready","status":"True","lastHeartbeatTime":"2025-02-22T01:56:36Z","lastTransitionTime":"2025-02-22T01:56:36Z","reason":"KubeletReady","message":"kubelet is posting ready status"}]
kube.go:188: failed to wait for "h3d1-2025-02-22-azurelinuxv2customsysctls" (h3d1-2025-02-22-azurelinuxv2customsysctls000000) to be ready {Capacity:map[cpu:{i:{value:2 scale:0} d:{Dec:<nil>} s:2 Format:DecimalSI} ephemeral-storage:{i:{value:52172304384 scale:0} d:{Dec:<nil>} s:50949516Ki Format:BinarySI} hugepages-1Gi:{i:{value:0 scale:0} d:{Dec:<nil>} s:0 Format:DecimalSI} hugepages-2Mi:{i:{value:0 scale:0} d:{Dec:<nil>} s:0 Format:DecimalSI} memory:{i:{value:8056139776 scale:0} d:{Dec:<nil>} s:7867324Ki Format:BinarySI} pods:{i:{value:110 scale:0} d:{Dec:<nil>} s:110 Format:DecimalSI}] Allocatable:map[cpu:{i:{value:1900 scale:-3} d:{Dec:<nil>} s:1900m Format:DecimalSI} ephemeral-storage:{i:{value:46955073868 scale:0} d:{Dec:<nil>} s:46955073868 Format:DecimalSI} hugepages-1Gi:{i:{value:0 scale:0} d:{Dec:<nil>} s:0 Format:DecimalSI} hugepages-2Mi:{i:{value:0 scale:0} d:{Dec:<nil>} s:0 Format:DecimalSI} memory:{i:{value:5552140288 scale:0} d:{Dec:<nil>} s:5422012Ki Format:BinarySI} pods:{i:{value:110 scale:0} d:{Dec:<nil>} s:110 Format:DecimalSI}] Phase: Conditions:[{Type:NetworkUnavailable Status:True LastHeartbeatTime:2025-02-22 01:56:46 +0000 UTC LastTransitionTime:2025-02-22 01:56:46 +0000 UTC Reason:NodeInitialization Message:Waiting for cloud routes} {Type:MemoryPressure Status:False LastHeartbeatTime:2025-02-22 01:56:36 +0000 UTC LastTransitionTime:2025-02-22 01:56:35 +0000 UTC Reason:KubeletHasSufficientMemory Message:kubelet has sufficient memory available} {Type:DiskPressure Status:False LastHeartbeatTime:2025-02-22 01:56:36 +0000 UTC LastTransitionTime:2025-02-22 01:56:35 +0000 UTC Reason:KubeletHasNoDiskPressure Message:kubelet has no disk pressure} {Type:PIDPressure Statu
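Two of the three failures share the same signature: the node's Ready condition is True, but it still carries the node.kubernetes.io/network-unavailable NoSchedule taint because the cloud controller has not yet programmed routes ("Waiting for cloud routes"). A minimal Python sketch of the kind of readiness gate the harness appears to apply (the condition and taint JSON below is copied from the kube.go:168 log line above; the helper name is hypothetical, not from the harness):

```python
import json

# Taint and a subset of the conditions reported for the node above.
taints = json.loads(
    '[{"key":"node.kubernetes.io/network-unavailable",'
    '"effect":"NoSchedule","timeAdded":"2025-02-22T01:56:46Z"}]'
)
conditions = json.loads(
    '[{"type":"NetworkUnavailable","status":"True"},'
    '{"type":"Ready","status":"True"}]'
)

def node_schedulable(conditions, taints):
    """Hypothetical sketch: treat a node as ready only when its Ready
    condition is True AND no NoSchedule taint remains on it."""
    ready = any(c["type"] == "Ready" and c["status"] == "True" for c in conditions)
    blocked = any(t.get("effect") == "NoSchedule" for t in taints)
    return ready and not blocked

print(node_schedulable(conditions, taints))  # False: Ready, but still tainted
```

This is why the wait in kube.go:147 never succeeds even though kubelet reports "posting ready status": the taint, not the Ready condition, is the blocker.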
Check failure on line 1 in Test_Ubuntu2204Gen2_ContainerdAirgappedK8sNotCached
azure-pipelines / Agentbaker E2E
Test_Ubuntu2204Gen2_ContainerdAirgappedK8sNotCached
Failed
Raw output
scenario_helpers_test.go:188: VHD: "/subscriptions/c4c3550e-a965-4993-a50c-628fd38cd3e1/resourceGroups/aksvhdtestbuildrg/providers/Microsoft.Compute/galleries/PackerSigGalleryEastUS/images/2204Gen2/versions/1.1725612526.29638", TAGS {Name:Test_Ubuntu2204Gen2_ContainerdAirgappedK8sNotCached ImageName:2204Gen2 OS:ubuntu Arch:amd64 Airgap:true NonAnonymousACR:false GPU:false WASM:false ServerTLSBootstrapping:false KubeletCustomConfig:false}
cluster.go:267: cluster abe2e-kubenet-airgap-64149 already exists in rg abe2e-westus3
cluster.go:123: node resource group: MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
cluster.go:134: using private acr "privateacre2ewestus3" isAnonyomusPull false
aks_model.go:208: Creating private Azure Container Registry privateacre2ewestus3 in rg abe2e-westus3
aks_model.go:338: Checking if private Azure Container Registry cache rules are correct in rg abe2e-westus3
aks_model.go:353: Private ACR cache is correct
aks_model.go:217: Private ACR already exists at id /subscriptions/8ecadfc9-d1a3-4ea4-b844-0d9f87e4d7c8/resourceGroups/abe2e-westus3/providers/Microsoft.ContainerRegistry/registries/privateacre2ewestus3, skipping creation
aks_model.go:72: Adding network settings for airgap cluster abe2e-kubenet-airgap-64149 in rg MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
aks_model.go:156: Checking if private endpoint for private container registry is in rg MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
aks_model.go:197: Private Endpoint already exists with ID: /subscriptions/8ecadfc9-d1a3-4ea4-b844-0d9f87e4d7c8/resourceGroups/MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3/providers/Microsoft.Network/privateEndpoints/PE-for-ABE2ETests
aks_model.go:165: Private Endpoint already exists, skipping creation
aks_model.go:108: updated cluster abe2e-kubenet-airgap-64149 subnet with airgap settings
kube.go:370: Creating daemonset debug-mariner with image privateacre2ewestus3.azurecr.io/cbl-mariner/base/core:2.0
kube.go:370: Creating daemonset debugnonhost-mariner with image privateacre2ewestus3.azurecr.io/cbl-mariner/base/core:2.0
kube.go:85: waiting for pod app=debug-mariner in "default" namespace to be ready
kube.go:133: pod debug-mariner-f2rlb is ready
vmss.go:39: creating VMSS "go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca" in resource group "MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3"
azure.go:497: creating VMSS go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca in resource group MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
azure.go:510: created VMSS go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca in resource group MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
scenario_helpers_test.go:146: vmss go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca creation succeeded
kube.go:147: waiting for node go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca to be ready
kube.go:168: node go3j-2025-02-22-ubuntu2204gen2containerdairgappedk8snotca000000 is tainted. Taints: [{"key":"node.kubernetes.io/network-unavailable","effect":"NoSchedule","timeAdded":"2025-02-22T02:02:09Z"}] Conditions: [{"type":"NetworkUnavailable","status":"True","lastHeartbeatTime":"2025-02-22T02:02:09Z","lastTransitionTime":"2025-02-22T02:02:09Z","reason":"NodeInitialization","message":"Waiting for cloud routes"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2025-02-22T02:01:56Z","lastTransitionTime":"2025-02-22T02:01:55Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2025-02-22T02:01:56Z","lastTransitionTime":"2025-02-22T02:01:55Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"PIDPressure","status":"False","lastHeartbeatTime":"2025-02-22T02:01:56Z","lastTransitionTime":"2025-02-22T02:01:55Z","reason":"KubeletHasSufficientPID
Check failure on line 1 in Test_Ubuntu2204_WASMAirGap
azure-pipelines / Agentbaker E2E
Test_Ubuntu2204_WASMAirGap
Failed
Raw output
scenario_helpers_test.go:188: VHD: "/subscriptions/c4c3550e-a965-4993-a50c-628fd38cd3e1/resourceGroups/aksvhdtestbuildrg/providers/Microsoft.Compute/galleries/PackerSigGalleryEastUS/images/2204gen2containerd/versions/1.1740128734.32301", TAGS {Name:Test_Ubuntu2204_WASMAirGap ImageName:2204gen2containerd OS:ubuntu Arch:amd64 Airgap:true NonAnonymousACR:false GPU:false WASM:false ServerTLSBootstrapping:false KubeletCustomConfig:false}
vmss.go:39: creating VMSS "ipte-2025-02-22-ubuntu2204wasmairgap" in resource group "MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3"
azure.go:497: creating VMSS ipte-2025-02-22-ubuntu2204wasmairgap in resource group MC_abe2e-westus3_abe2e-kubenet-airgap-64149_westus3
vmss.go:83:
Error Trace: /mnt/vss/_work/1/s/e2e/vmss.go:83
/mnt/vss/_work/1/s/e2e/scenario_helpers_test.go:142
/mnt/vss/_work/1/s/e2e/scenario_helpers_test.go:100
/mnt/vss/_work/1/s/e2e/scenario_test.go:1385
Error: Received unexpected error:
context deadline exceeded
Test: Test_Ubuntu2204_WASMAirGap
Messages: create vmss "ipte-2025-02-22-ubuntu2204wasmairgap", check scenario-logs-1740189197/Test_Ubuntu2204_WASMAirGap for vm logs
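This third failure is different in kind: the VMSS create call itself never returned before the test's context expired ("context deadline exceeded"), rather than the node failing a readiness check. A rough Python sketch of a deadline-bounded wait of this shape (timeout values and function names are illustrative, not taken from the harness):

```python
import time

def wait_until(check, deadline_seconds, interval=1.0):
    """Poll check() until it returns True or the deadline passes,
    mimicking a context-with-deadline wait; raises TimeoutError on expiry."""
    deadline = time.monotonic() + deadline_seconds
    while time.monotonic() < deadline:
        if check():
            return
        time.sleep(min(interval, max(0.0, deadline - time.monotonic())))
    raise TimeoutError("context deadline exceeded")

# Illustrative use: a check that never succeeds trips the deadline,
# just as the VMSS create above did.
try:
    wait_until(lambda: False, deadline_seconds=0.05, interval=0.01)
except TimeoutError as e:
    print(e)  # context deadline exceeded
```

When the deadline fires, the harness dumps the diagnostic scripts below and deletes the VMSS, which is why the provisioning logs are collected after the error rather than before it.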
kube.go:85: waiting for pod app=debug-mariner in "default" namespace to be ready
kube.go:133: pod debug-mariner-f2rlb is ready
exec.go:91: Executing script script_file_391444d3-e085-4c74-9ad5-d44c98172556.sh using bash:
---START-SCRIPT---
sudo cat /var/log/azure/cluster-provision.log
---END-SCRIPT---
exec.go:91: Executing script script_file_a496db78-cc00-4d2c-b99f-b217a3f255b7.sh using bash:
---START-SCRIPT---
sudo journalctl -u kubelet
---END-SCRIPT---
exec.go:91: Executing script script_file_659fb253-94e4-4b37-ba01-0bed568cb2d9.sh using bash:
---START-SCRIPT---
sudo cat /var/log/azure/cluster-provision-cse-output.log
---END-SCRIPT---
exec.go:91: Executing script script_file_58a58182-3d02-4ef2-8bbd-aa5c75cca97d.sh using bash:
---START-SCRIPT---
sudo sysctl -a
---END-SCRIPT---
exec.go:91: Executing script script_file_095f391a-8a35-4870-8aae-14091f072237.sh using bash:
---START-SCRIPT---
sudo cat /var/log/azure/aks-node-controller.log
---END-SCRIPT---
vmss.go:294: vmss "ipte-2025-02-22-ubuntu2204wasmairgap" deleted successfully