
[EKS] [Dashboard]: Kubernetes services dashboard in AWS console #135

Closed
cgswong opened this issue Jan 28, 2019 · 13 comments
Labels
Console AWS Container Services Console EKS Amazon Elastic Kubernetes Service Proposed Community submitted issue

Comments

@cgswong

cgswong commented Jan 28, 2019

Tell us about your request
A Kubernetes dashboard, similar to the upstream Kubernetes dashboard, built into the EKS UI in the AWS console.

Which service(s) is this request for?
EKS

Tell us about the problem you're trying to solve. What are you trying to do, and why is it hard?
Make it easier to see and manage, within the AWS console, the state of the various Kubernetes components, pods, and services, similar to the Kubernetes dashboard. This is part of the poor onboarding experience: after cluster installation we do install the Kubernetes dashboard, but this is less than ideal and requires CLI commands each time to view it. GKE is much better than EKS in this regard, so this feature would bring parity, if not better (depending on implementation).

Are you currently working around this issue?
Install the Kubernetes dashboard after cluster installation, then run a proxy command each time to view it.
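A minimal sketch of that workaround (the manifest URL and version are assumptions; check the kubernetes/dashboard releases page for the current one):

```sh
# Install the upstream Kubernetes dashboard once per cluster
# (manifest URL/version is an assumption; see the project's releases page).
kubectl apply -f https://raw.githubusercontent.com/kubernetes/dashboard/v2.0.0/aio/deploy/recommended.yaml

# Every time you want to view it, start a local proxy...
kubectl proxy

# ...then browse to:
# http://localhost:8001/api/v1/namespaces/kubernetes-dashboard/services/https:kubernetes-dashboard:/proxy/
```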

@cgswong cgswong added the Proposed Community submitted issue label Jan 28, 2019
@micahlmartin

It would definitely be nice to have a built-in dashboard, but you could set up an ingress for it so you wouldn't have to run the proxy.
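For example, a hypothetical Ingress for the dashboard service might look like this (the host, ingress controller, and annotation are assumptions for illustration; note the dashboard pod serves HTTPS):

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: kubernetes-dashboard
  namespace: kubernetes-dashboard
  annotations:
    # The dashboard serves HTTPS, so tell the (assumed) nginx ingress
    # controller to talk HTTPS to the backend.
    nginx.ingress.kubernetes.io/backend-protocol: "HTTPS"
spec:
  rules:
    - host: dashboard.example.com   # placeholder host
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: kubernetes-dashboard
                port:
                  number: 443
```

You would still want authentication in front of this, since exposing the dashboard publicly without it is a known risk.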

@cgswong
Author

cgswong commented Jan 29, 2019

@micahlmartin That is true! I completely overlooked that aspect. My request still remains though as such a built-in dashboard in the AWS Console would greatly improve the cluster setup experience and for most eliminate having to add the Kubernetes dashboard installation step.

@ghost

ghost commented Jan 29, 2019

I agree. We have to add the Kubernetes dashboard for every cluster we spin up; having it available as part of an EKS spin-up in the AWS console would be a tremendous value add and one less thing for us to manage. Or at least a flag to enable it.

@abby-fuller abby-fuller added the EKS Amazon Elastic Kubernetes Service label Jan 30, 2019
@mikehalof

@dbayendor-cb
I have the same requirement and need to automate that deployment.
Are you doing it with automation? Can you share? It would be very helpful so I don't have to re-invent it.

Mike

@ghost

ghost commented Feb 1, 2019

@mikehalof We deploy the Kubernetes dashboard with the current stable Helm chart, kicked off from Helmsman into our clusters at build time and maintained from there. We override values so there's one instance for the admin that is cluster-wide, and one for each team's namespace with limited RBAC, then expose the service. Seems like a lot of work. Also, we have to add in Heapster to get the sparklines and graphs working; we're still waiting for the dashboard to drop that dependency and use metrics-server, which we already have installed to handle HPA, scaling, etc.
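A rough sketch of that setup with plain Helm (the chart repo, release names, and value keys are assumptions; check the chart's values.yaml before relying on them):

```sh
# Chart repo/name are assumptions; the old "stable" repo has since been deprecated.
helm repo add kubernetes-dashboard https://kubernetes.github.io/dashboard/

# Cluster-wide instance for admins
helm install admin-dashboard kubernetes-dashboard/kubernetes-dashboard \
  --namespace kube-system \
  --set rbac.clusterAdminRole=true

# Per-team instance, scoped to the team's namespace with limited RBAC
helm install team-a-dashboard kubernetes-dashboard/kubernetes-dashboard \
  --namespace team-a \
  --set rbac.create=true
```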

@abby-fuller abby-fuller added the Console AWS Container Services Console label Feb 13, 2019
@tabern tabern changed the title [EKS] [Dashboad]: Kubernetes services dashboard in AWS console [EKS] [Dashboard]: Kubernetes services dashboard in AWS console Nov 15, 2020
@tabern
Contributor

tabern commented Dec 1, 2020

You can now see Kubernetes API resources and applications running on your Amazon EKS cluster using the AWS Management Console. This makes it easy to visualize and troubleshoot Kubernetes applications using Amazon EKS.

We're starting today with Workloads, and will be rapidly adding more resources and capabilities.

Learn more:

Check out the new console and let us know what you think! We'll be closing this issue; if there's a feature or enhancement you'd like to see us build for the new console, please open an issue!

@tabern tabern closed this as completed Dec 1, 2020
@groodt

groodt commented Dec 1, 2020

@tabern Is a new IAM permission required to access this? I'm seeing an error that I've never seen before.

 is not authorized to perform: eks:AccessKubernetesApi on resource: arn:aws:eks

@jlbutler

jlbutler commented Dec 2, 2020

@groodt If you use a custom policy with granular permissions for the EKS console, you'll need to add that one. It's used to bridge the gap from the console into the cluster; from there, RBAC in the cluster takes over, so also make sure the console credentials are present in the cluster's auth map. Let us know if that doesn't resolve things for you!
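For reference, a custom policy statement granting that permission might look roughly like this (the account ID, region, and cluster name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "EKSConsoleKubernetesApi",
      "Effect": "Allow",
      "Action": [
        "eks:DescribeCluster",
        "eks:AccessKubernetesApi"
      ],
      "Resource": "arn:aws:eks:us-east-1:111122223333:cluster/my-cluster"
    }
  ]
}
```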

@mtparet

mtparet commented Dec 3, 2020

We have a similar issue:

Error loading Namespaces
Unauthorized: Verify you have access to the Kubernetes cluster

We do not use a custom policy, and I have full admin rights on this AWS account.

@Dudssource

@mtparet my guess is that you may need to add your IAM role/user to your cluster's aws-auth config map, just like @jlbutler pointed out.

Reference:
https://docs.aws.amazon.com/eks/latest/userguide/add-user-role.html
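Concretely, that usually means an entry like this in the aws-auth ConfigMap in kube-system (the role ARN, username, and group are examples only):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::111122223333:role/ConsoleAccessRole  # example ARN
      username: console-user
      groups:
        - system:masters  # full access; scope down via RBAC in practice
```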

@mtparet

mtparet commented Dec 3, 2020

Ok, before that I didn't need it to manage the cluster.

Again something to do manually...

@Dudssource

@mtparet unfortunately yes :(
But, for example, if you are using eksctl to manage your cluster, there's an option to have it do this for you
(https://eksctl.io/usage/iam-identity-mappings/).
Or if you are a Terraform user relying on the community module, it's possible too (map_roles and map_users).
There's also issue #185, which targets the ability to manage the aws-auth ConfigMap with CloudFormation.
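With eksctl, for example, adding an IAM identity mapping is a single command (the cluster name, role ARN, and group are placeholders):

```sh
eksctl create iamidentitymapping \
  --cluster my-cluster \
  --arn arn:aws:iam::111122223333:role/ConsoleAccessRole \
  --username console-user \
  --group system:masters
```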

@mreferre

mreferre commented Dec 3, 2020

@mtparet The cluster itself is an AWS object, so IAM roles/users/policies apply. The cluster's internal objects (pods, services, etc.) are Kubernetes objects that need Kubernetes authorization to access (via adding the IAM role/user to the Kubernetes authentication mechanism). Point taken that this should be a smoother experience.
