
Reuse dictionaries in ActivityExtensions, MetricItemExtensions and OtlpLogExporter #5727

Closed

Conversation

paulomorgado

Fixes #4943

I have to admit I don't know enough about the usage to know if this is really the best option, but it will reduce the number of allocations and internal dictionary growths.

Changes

Reuses a single dictionary instance in ActivityExtensions, MetricItemExtensions, and OtlpLogExporter.

Merge requirement checklist

  • CONTRIBUTING guidelines followed (license requirements, nullable enabled, static analysis, etc.)
  • Unit tests added/updated
  • Appropriate CHANGELOG.md files updated for non-trivial changes
  • Changes in public API reviewed (if applicable)

@paulomorgado paulomorgado requested a review from a team June 27, 2024 18:28

linux-foundation-easycla bot commented Jun 27, 2024

CLA Signed

The committers listed above are authorized under a signed CLA.

  • ✅ login: paulomorgado / name: Paulo Morgado (f989e01)

@github-actions github-actions bot added the pkg:OpenTelemetry.Exporter.OpenTelemetryProtocol Issues related to OpenTelemetry.Exporter.OpenTelemetryProtocol NuGet package label Jun 27, 2024
@cijothomas
Member

@paulomorgado Thanks!
Could you check the benchmarks and share the results here: https://github.com/open-telemetry/opentelemetry-dotnet/tree/main/test/Benchmarks/Exporter so we can see before/after?

```csharp
    Resource = processResource,
};
request.ResourceSpans.Add(resourceSpans);
var spansByLibrary = Interlocked.Exchange(ref spansByLibraryCached, null) ?? new Dictionary<string, ScopeSpans>();
```
Member


@paulomorgado - We could define this field as [ThreadStatic]

```csharp
[ThreadStatic]
private static Dictionary<string, ScopeSpans>? spansByLibrary;

// ...

internal static void AddBatch(..)
{
    spansByLibrary ??= new Dictionary<string, ScopeSpans>();
    // ...
}
```

Author

@paulomorgado paulomorgado Jun 28, 2024


@vishweshbankwar,

@paulomorgado - We could define this field as [ThreadStatic]

```csharp
[ThreadStatic]
private static Dictionary<string, ScopeSpans>? spansByLibrary;

// ...

internal static void AddBatch(..)
{
    spansByLibrary ??= new Dictionary<string, ScopeSpans>();
    // ...
}
```

How many concurrent threads do you expect to be requesting these dictionaries at the same time?

With [ThreadStatic], the dictionaries will be created per thread, and which thread runs this code might not always be the same one, so instances can accumulate and effectively leak.

[ThreadStatic] is for when you are sure that every thread running the code will need its own instance.
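The trade-off being discussed can be sketched as follows. This is an illustrative sketch, not the exporter's actual code: `ScopeMapCache`, `Rent`, and `Return` are hypothetical names, and `object` stands in for the real `ScopeSpans` values.

```csharp
using System.Collections.Generic;
using System.Threading;

internal static class ScopeMapCache
{
    // Option A (this PR's approach): a single shared instance handed out
    // via Interlocked.Exchange. If two threads rent at once, the second
    // simply allocates a fresh dictionary.
    private static Dictionary<string, object>? sharedCache = new();

    public static Dictionary<string, object> Rent()
        => Interlocked.Exchange(ref sharedCache, null) ?? new Dictionary<string, object>();

    public static void Return(Dictionary<string, object> map)
    {
        map.Clear();
        Interlocked.Exchange(ref sharedCache, map);
    }

    // Option B (the [ThreadStatic] suggestion): one instance per thread.
    // No synchronization, but every thread that ever runs the export path
    // keeps its own dictionary alive for the thread's lifetime.
    [ThreadStatic]
    private static Dictionary<string, object>? threadLocalCache;

    public static Dictionary<string, object> RentThreadLocal()
        => threadLocalCache ??= new Dictionary<string, object>();
}
```

With option A the cache size is bounded by one dictionary per concurrent export, regardless of which thread pool threads happen to run it; with option B it is bounded by the number of distinct threads that ever export.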

Member


There is a dedicated Exporter thread and that should be the only one invoking this code path.

Author


We can benchmark it, but accessing thread-local storage might be worse than interlocked operations.
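One quick way to probe this claim is a rough Stopwatch micro-benchmark comparing the two rent patterns; the repository's BenchmarkDotNet suite is the proper tool, so treat this sketch (with its hypothetical `CacheBench` type) as a smoke test only.

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

internal static class CacheBench
{
    private static Dictionary<string, object>? shared = new();

    [ThreadStatic]
    private static Dictionary<string, object>? perThread;

    // Returns elapsed ticks for N rent/return cycles of each strategy.
    public static (long interlockedTicks, long threadStaticTicks) Run(int iterations)
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            // Interlocked rent + return on a shared slot.
            var d = Interlocked.Exchange(ref shared, null) ?? new Dictionary<string, object>();
            Interlocked.Exchange(ref shared, d);
        }
        long interlockedTicks = sw.ElapsedTicks;

        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            // Thread-local rent; TLS access cost is what we're measuring.
            var d = perThread ??= new Dictionary<string, object>();
            d.Clear();
        }
        long threadStaticTicks = sw.ElapsedTicks;

        return (interlockedTicks, threadStaticTicks);
    }
}
```

Both loops stay on one thread, so this only measures the uncontended cost; a real comparison would also exercise contention across exporter threads.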

Member


How many concurrent threads do you expect to be requesting these dictionaries at the same time?

It depends. With a single exporter instance configured with a BatchExportProcessor, there would be only one thread executing this. With multiple instances, there could be multiple threads executing this.

Member


We could do that. That would require some refactoring, and I would not recommend that effort as the implementation will change anyway with #5450.

Is the concern that #5450 will change the OTLP Exporter internals, so it's better not to make any changes there for now?
If that is the case, then let's close this PR itself, as we should not proceed with any of the approaches being discussed until the rewrite is complete.

Contributor

@utpilla utpilla Jul 1, 2024


There is a dedicated Exporter thread and that should be the only one invoking this code path.

Isn't this code path called for SimpleExportProcessor as well?

Member


Is the concern that #5450 will change the OTLP Exporter internals, so it's better not to make any changes there for now?

Yes

If that is the case, then let's close this PR itself, as we should not proceed with any of the approaches being discussed until the rewrite is complete.

Agree. I was open to doing the small change until #5450 completes. We already have #5731 now; we can close this.

Member


@utpilla

Isn't this code path called for SimpleExportProcessor as well?

Yes - that is also called within the lock, right?

Contributor

@utpilla utpilla Jul 1, 2024


Yup. I was mainly concerned with the dictionary being static. The issue would mainly arise with multiple exporter instances:

  • Multiple SimpleExportProcessor instances
  • Multiple BatchExportProcessor instances
  • Or a combination of both

@vishweshbankwar I just noticed you already covered this concern in one of your comments above.
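The multiple-instance concern is what the Interlocked.Exchange pattern is designed to survive: when two processors export concurrently, one thread takes the cached dictionary and the other sees `null` and allocates, so neither observes the other's data. A minimal sketch, using a hypothetical `SharedCacheDemo` type rather than the exporter's real code:

```csharp
using System.Collections.Generic;
using System.Threading;

internal static class SharedCacheDemo
{
    private static Dictionary<string, int>? cache = new();

    // Atomically takes the cached instance (leaving null behind) or
    // allocates a new one if another renter got there first.
    public static Dictionary<string, int> Rent()
        => Interlocked.Exchange(ref cache, null) ?? new Dictionary<string, int>();
}

// Processor 1's thread: var a = SharedCacheDemo.Rent(); // takes the cached map
// Processor 2's thread: var b = SharedCacheDemo.Rent(); // slot is null -> allocates
// a and b are distinct instances, so the static field is safe to share.
```

A plain static dictionary without the exchange would be the unsafe variant: both processors would mutate the same instance concurrently.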

@paulomorgado
Author

@cijothomas,

@paulomorgado Thanks! Could you check the benchmarks and share the results here: https://github.com/open-telemetry/opentelemetry-dotnet/tree/main/test/Benchmarks/Exporter so we can see before/after?

I've collected the benchmark results. There are a lot of them.

Are you looking for any particular ones?

@cijothomas
Member

cijothomas commented Jun 28, 2024

@cijothomas,

@paulomorgado Thanks! Could you check the benchmarks and share the results here: https://github.com/open-telemetry/opentelemetry-dotnet/tree/main/test/Benchmarks/Exporter so we can see before/after?

I've collected the benchmark results. There are a lot of them.

Are you looking for any particular ones?

Could you share them, so it is easy to tell how much we are improving? (It also ensures we don't accidentally regress!)
Example : #5457 (comment)

There are a few OTLP Exporter-specific benchmarks:
https://github.com/open-telemetry/opentelemetry-dotnet/blob/main/test/Benchmarks/Exporter/OtlpTraceExporterBenchmarks.cs

@vishweshbankwar
Member

@paulomorgado - Thank you for your time; I appreciate the help. As discussed in #5727 (comment), I am closing this PR, as the implementation is soon going to change.
