
Do not collect pod metrics using Bottlerocket #1071

Closed
SpringMT opened this issue Mar 20, 2022 · 7 comments
Labels
EKS (EKS related request), metrics (Metrics related issue)

Comments

@SpringMT (Contributor) commented Mar 20, 2022

Describe the bug
When using Bottlerocket in EKS, the otel-collector cannot collect Pod metrics.
The log is below:

2022-03-20T14:04:42.986Z	warn	cadvisor/container_info_processor.go:93	No pod metric collected	{"kind": "receiver", "name": "awscontainerinsightreceiver", "metrics count": 27}

Steps to reproduce

  1. Create a cluster with Bottlerocket
  2. Set up aws-otel-collector with aws-otel-collector/deployment-template/eks/otel-container-insights-infra.yaml
  • I use the amazon/aws-otel-collector:v0.17.0 image

What did you expect to see?
This bug is similar to aws/amazon-cloudwatch-agent#188, which was fixed by aws/amazon-cloudwatch-agent#189.

aws-otel-collector uses container_info_processor from opentelemetry-collector-contrib:

https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/receiver/awscontainerinsightreceiver/internal/cadvisor/container_info_processor.go

I think changes similar to those in aws/amazon-cloudwatch-agent#189 will fix this bug.

What did you see instead?

Environment
Bottlerocket in EKS

Additional context

@bryan-aguilar added the metrics (Metrics related issue) and EKS (EKS related request) labels on Mar 21, 2022
@github-actions (bot) commented:
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@mhausenblas (Member) commented:
@SpringMT can you please confirm that this still happens with v0.18 of the ADOT collector?

@YDKK commented Jun 9, 2022

@mhausenblas I have the same issue and have confirmed that it still happens with the amazon/aws-otel-collector:v0.18.0 image.

@YDKK commented Jun 9, 2022

As a workaround, I created a volume mount from /run/dockershim.sock to /run/containerd/containerd.sock (the same way the CloudWatch Agent does), and it works fine.
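
For reference, a minimal sketch of what such a mount could look like in the collector DaemonSet pod spec. The volume name and surrounding structure are illustrative assumptions, not copied from the actual otel-container-insights-infra.yaml:

  # Hypothetical excerpt from the aws-otel-collector DaemonSet pod spec.
  containers:
    - name: aws-otel-collector
      volumeMounts:
        - name: containerdsock
          # Path where the receiver is assumed to look for the containerd socket inside the container.
          mountPath: /run/containerd/containerd.sock
          readOnly: true
  volumes:
    - name: containerdsock
      hostPath:
        # Socket path exposed on the Bottlerocket host in this report.
        path: /run/dockershim.sock

The idea is that the receiver keeps reading the path it expects inside the container, while the hostPath points at whichever socket the node's runtime actually exposes.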

@vasireddy99 (Contributor) commented Jun 23, 2022

Adding context here:

Interestingly, Bottlerocket has changed the mount point to containerd for Kubernetes 1.23 support. So yes, if you are using the latest version of Bottlerocket, the mount needs to be changed to /run/containerd/containerd.sock.

We are currently working on this issue and will provide more context as soon as we have an update.
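
For illustration, on Bottlerocket releases with Kubernetes 1.23 support the hostPath in the sketch above would point at the new socket location. This is an assumption about the manifest layout, not the actual fix that shipped:

  volumes:
    - name: containerdsock
      hostPath:
        # Newer Bottlerocket exposes the containerd CRI socket here.
        path: /run/containerd/containerd.sock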

@vasireddy99 (Contributor) commented:

Thank you so much for creating this issue, and we appreciate the good detail. Support for collecting pod metrics while running on an EKS cluster with the containerd runtime is available from ADOT Collector v0.21. Closing this issue.

@SpringMT (Contributor, Author) commented:

Thank you!
