
[Feature]: Documentation Missing Cache Provider #90

Open
2 tasks done
amitamrutiya opened this issue Feb 14, 2024 · 2 comments
Comments

@amitamrutiya
Contributor

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've discussed this feature request in the K8sGPT Slack and got positive feedback

Is this feature request related to a problem?

Yes

Problem Description

The docs currently do not list all of the AI cache providers that are available in K8sGPT:
https://docs.k8sgpt.ai/explanation/caching/

The following backends are missing:

  • Microsoft Azure Blob Storage
  • Google Cloud Storage

Solution Description

I will create a pull request and link it to this issue, which will include the documentation for the missing cache providers.

Benefits

Users will also know how to configure the other cache providers.

Potential Drawbacks

N/A

Additional Information

N/A

@qdrddr

qdrddr commented Mar 20, 2024

And

  1. The doc doesn't explicitly explain how AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are applied to the command. The doc should explain that these are required and must be set as environment variables before running the CLI.
  2. The same issue with the CLI tool: k8sgpt cache add --help doesn't tell you how to pass ACCESS_KEY & SECRET_ACCESS.
  3. The CLI help itself doesn't show the relevant keys; in k8sgpt cache add --help they are simply not there:
k8sgpt cache add --help

This command allows you to add a remote cache to store the results of an analysis.
        The supported cache types are:
        - Azure Blob storage
        - Google Cloud storage
        - S3

Documentation should have this info from the README:

Adding a remote cache

  • AWS S3
    • As a prerequisite, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are required as environment variables.
    • Configuration, k8sgpt cache add s3 --region <aws region> --bucket <name>
      • K8sGPT will create the bucket if it does not exist
  • Azure Storage
    • We support a number of techniques to authenticate against Azure
    • Configuration, k8sgpt cache add azure --storageacc <storage account name> --container <container name>
      • K8sGPT assumes that the storage account already exists, and it will create the container if it does not exist
      • It is the user's responsibility to grant their identity the permissions needed to upload blob files and create containers (e.g. Storage Blob Data Contributor)
  • Google Cloud Storage
    • As a prerequisite, the GOOGLE_APPLICATION_CREDENTIALS environment variable is required.
    • Configuration, k8sgpt cache add gcs --region <gcp region> --bucket <name> --projectid <project id>
      • K8sGPT will create the bucket if it does not exist
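Putting the README excerpt above together, a minimal shell sketch of adding each remote cache might look like the following. The credential values, bucket/container names, regions, and project ID are placeholders, and the snippet assumes the k8sgpt CLI is installed; the key point (raised in this comment) is that the AWS and GCP credentials are read from environment variables, not passed as flags.

```shell
# Credentials are supplied via environment variables (placeholder values):
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret"
# GCS expects a path to a service-account key file:
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/gcp-sa-key.json"

# Only attempt the cache commands if the CLI is actually installed.
if command -v k8sgpt >/dev/null 2>&1; then
  # AWS S3: the bucket is created if it does not exist.
  k8sgpt cache add s3 --region us-east-1 --bucket my-k8sgpt-cache

  # Azure Blob Storage: the storage account must already exist;
  # the container is created if missing.
  k8sgpt cache add azure --storageacc myk8sgptsa --container k8sgpt-cache

  # Google Cloud Storage: the bucket is created if it does not exist.
  k8sgpt cache add gcs --region us-east1 --bucket my-k8sgpt-cache --projectid my-project
fi
```

If the docs included a snippet like this per provider, the missing ACCESS_KEY/SECRET_ACCESS question from the CLI help would be answered in one place.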

@AlexsJones
Member

This is a really good point, thanks
