
test: add tests for custom status readers #316

Merged — 1 commit into main, Nov 20, 2024
Conversation

@zreigz (Member) commented Nov 20, 2024

Summary by CodeRabbit

  • New Features

    • Introduced comprehensive unit tests for DeploymentResourceReader, ReplicaSetStatusReader, and StatefulSetResourceReader, improving the reliability of resource status reading.
  • Bug Fixes

    • Improved error handling for scenarios such as resource not found and context cancellation across all tested resource readers.

@zreigz zreigz added the enhancement New feature or request label Nov 20, 2024

coderabbitai bot commented Nov 20, 2024

📝 Walkthrough

Walkthrough

This pull request introduces three new test files within the statusreaders_test package: deployment_test.go, replicaset_test.go, and statefulset_test.go. Each file contains unit tests for the respective resource readers (DeploymentResourceReader, ReplicaSetStatusReader, and StatefulSetResourceReader). The tests validate the ReadStatus method under various scenarios, including handling current resources, resource not found cases, and context cancellations. The tests utilize mock objects and the testify library for assertions, following a table-driven approach for ease of future test additions.

Changes

File Path Change Summary
internal/kstatus/statusreaders/deployment_test.go Added unit tests for DeploymentResourceReader, validating ReadStatus under various scenarios.
internal/kstatus/statusreaders/replicaset_test.go Added unit tests for ReplicaSetStatusReader, validating ReadStatus under various scenarios.
internal/kstatus/statusreaders/statefulset_test.go Added unit tests for StatefulSetResourceReader, validating ReadStatus under various scenarios.

Poem

🐇 In the garden of tests, we hop and play,
With deployments and replicas, we brighten the day.
Stateful sets join in, with statuses clear,
Each case a delight, bringing us cheer!
So let’s raise a carrot, and give a big cheer,
For tests that ensure our code is sincere! 🥕✨




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Outside diff range and nitpick comments (8)
internal/kstatus/statusreaders/deployment_test.go (2)

28-51: Consider improving test data maintainability

While the test data is comprehensive, consider these improvements:

  1. Add comments explaining the significance of specific fields in the test deployment
  2. Consider using Go raw string literals (backticks) with proper YAML indentation for better readability

Example improvement:

 var (
+  // currentDeployment represents a fully available deployment with:
+  // - 1 replica
+  // - All conditions met (Available and Progressing)
+  // - Matching generation and observedGeneration
   currentDeployment = strings.TrimSpace(`
     apiVersion: apps/v1
     kind: Deployment
     # ... rest of the YAML with proper indentation
   `)
 )

54-87: Consider adding more test cases for comprehensive coverage

The current test cases cover the basic scenarios well, but consider adding tests for:

  1. Invalid deployment status conditions
  2. Partial availability (not all replicas ready)
  3. Generation mismatch scenarios
  4. Invalid resource version
internal/kstatus/statusreaders/replicaset_test.go (3)

28-69: Consider simplifying the test data YAML

While the test data is comprehensive, consider removing fields that aren't essential for testing the status reader (e.g., creationTimestamp, terminationMessagePath, etc.). This would make the test data more focused and easier to maintain.

 currentReplicaset = strings.TrimSpace(`
 apiVersion: apps/v1
 kind: ReplicaSet
 metadata:
   labels:
     app: guestbook
-    plural.sh/managed-by: agent
     tier: frontend
   name: frontend
   namespace: test-do
   resourceVersion: "4869207"
   uid: 437e2329-59e4-42b9-ae40-48da3562d17e
 spec:
   replicas: 3
   selector:
     matchLabels:
       tier: frontend
   template:
     metadata:
-      creationTimestamp: null
       labels:
         tier: frontend
     spec:
       containers:
       - image: us-docker.pkg.dev/google-samples/containers/gke/gb-frontend:v5
-        imagePullPolicy: IfNotPresent
         name: php-redis
-        resources: {}
-        terminationMessagePath: /dev/termination-log
-        terminationMessagePolicy: File
-      dnsPolicy: ClusterFirst
-      restartPolicy: Always
-      schedulerName: default-scheduler
-      securityContext: {}
-      terminationGracePeriodSeconds: 30
 status:
   availableReplicas: 3
   fullyLabeledReplicas: 3
   observedGeneration: 1
   readyReplicas: 3
   replicas: 3
`)

72-105: Consider adding more test cases for edge scenarios

The current test cases cover the basic scenarios well. Consider adding tests for:

  1. Invalid resource version
  2. Generation mismatch between spec and status
  3. Partial availability (when not all replicas are ready)
  4. Zero replicas case

Example additional test case:

"Partial availability": {
    identifier: object.UnstructuredToObjMetadata(testutil.YamlToUnstructured(t, partialReplicaset)),
    readerResource: testutil.YamlToUnstructured(t, partialReplicaset),
    expectedResourceStatus: &event.ResourceStatus{
        Status:  status.InProgressStatus,
        Message: "ReplicaSet is not available. Ready: 2/3 replicas",
    },
},

119-126: Consider using testify's require package for error assertions

For better test failure messages, consider using require.Error and require.Equal for error assertions:

 if tc.expectedErr != nil {
-    if err == nil {
-        t.Errorf("expected error, but didn't get one")
-    } else {
-        assert.EqualError(t, err, tc.expectedErr.Error())
-    }
+    require.Error(t, err)
+    require.Equal(t, tc.expectedErr, err)
     return
 }
internal/kstatus/statusreaders/statefulset_test.go (3)

22-101: Consider simplifying test data and improving organization.

The StatefulSet manifest is quite detailed for a unit test. Consider:

  1. Simplifying the manifest to include only the fields necessary for testing the status reader
  2. Moving the test data to a separate test fixtures file (e.g., testdata/statefulset.yaml) for better maintainability

Example of a simplified manifest:

apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: web
  namespace: test-do
spec:
  replicas: 3
  selector:
    matchLabels:
      app: nginx
status:
  availableReplicas: 0
  replicas: 1
  currentReplicas: 1

103-163: Consider adding more test coverage and improving test case descriptions.

The test implementation is good, but consider these improvements:

  1. Add test cases for edge scenarios:

    • StatefulSet with zero replicas
    • StatefulSet with status field missing
    • StatefulSet with generation mismatch
    • StatefulSet with failed pods
  2. Make test case names more descriptive, e.g.:

    • "In progress resource" → "should_report_in_progress_when_replicas_not_fully_available"
    • "Resource not found" → "should_report_not_found_when_statefulset_doesnt_exist"
  3. Add more specific assertions for the status message format

Example of additional test cases:

"should_report_in_progress_when_status_missing": {
    identifier: object.UnstructuredToObjMetadata(testutil.YamlToUnstructured(t, statefulsetWithoutStatus)),
    readerResource: testutil.YamlToUnstructured(t, statefulsetWithoutStatus),
    expectedResourceStatus: &event.ResourceStatus{
        Status: status.InProgressStatus,
        Message: "Status field not found",
    },
},
"should_report_in_progress_when_generation_mismatch": {
    identifier: object.UnstructuredToObjMetadata(testutil.YamlToUnstructured(t, statefulsetWithGenerationMismatch)),
    readerResource: testutil.YamlToUnstructured(t, statefulsetWithGenerationMismatch),
    expectedResourceStatus: &event.ResourceStatus{
        Status: status.InProgressStatus,
        Message: "Generation mismatch",
    },
},

1-163: Consider organizing tests into test suites for better structure.

Given that this is part of a larger set of status reader tests (including deployment and replicaset), consider using a test suite structure (e.g., using suite.Suite from testify) to share common test utilities and fixtures across all resource reader tests. This would:

  1. Reduce code duplication across test files
  2. Provide consistent test patterns for all resource readers
  3. Make it easier to add tests for new resource types

Example structure:

type StatusReaderTestSuite struct {
    suite.Suite
    fakeReader *fakecr.ClusterReader
    fakeMapper *fakemapper.FakeRESTMapper
}

func (s *StatusReaderTestSuite) SetupTest() {
    // Common setup
}

func (s *StatusReaderTestSuite) TestStatefulSetReadStatus() {
    // Current test implementation
}
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 0c0539a and 30ee9c2.

📒 Files selected for processing (3)
  • internal/kstatus/statusreaders/deployment_test.go (1 hunks)
  • internal/kstatus/statusreaders/replicaset_test.go (1 hunks)
  • internal/kstatus/statusreaders/statefulset_test.go (1 hunks)
🔇 Additional comments (4)
internal/kstatus/statusreaders/deployment_test.go (2)

1-20: LGTM! Well-structured imports and package organization.

The package structure and imports follow Go best practices and include all necessary dependencies.


89-114: LGTM! Well-structured test implementation

The test implementation follows best practices:

  • Uses table-driven tests for maintainability
  • Properly handles error cases
  • Uses appropriate test assertions
internal/kstatus/statusreaders/replicaset_test.go (1)

1-20: LGTM!

The package declaration and imports are well-organized and follow Go conventions.

internal/kstatus/statusreaders/statefulset_test.go (1)

1-20: LGTM! Package and imports are well-organized.

The package name follows Go conventions for test files, and imports are properly organized.

@zreigz zreigz merged commit a28ae92 into main Nov 20, 2024
27 of 34 checks passed
@zreigz zreigz deleted the tests-custom-status-readers branch November 20, 2024 09:46
Labels: enhancement, size/L
3 participants