
[exporter][batcher] Multi-batch support - Version 2 #12760


Open · sfc-gh-sili wants to merge 2 commits into main from sili-metadata-key-3
Conversation

@sfc-gh-sili (Contributor) commented Mar 29, 2025

Description

This PR introduces two new components:

  • Partitioner - an interface for computing the batching key. A partitioner type should implement GetKey(), which returns the batching key for a request. The partitioner is provided to the queue_batcher along with the sizer in queue_batch::Settings.
  • multi_batcher - supports key-based batching by routing each request to the corresponding default_batcher. Each default_batcher corresponds to a shard, as described in #12473 (Exporter batcher dynamic sharding of the partitions). A rough sketch of both pieces follows this list.
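To make the shape concrete, here is a minimal sketch of how the two pieces could fit together. Only Partitioner, GetKey(), multi_batcher/default_batcher, and getShard() come from this PR; Request, the field names, and the mutex-guarded shard map are illustrative assumptions, not the PR's actual code.

```go
package queuebatchsketch

import (
	"context"
	"sync"
)

// Request stands in for the exporter's request.Request type (assumption).
type Request interface{}

// Partitioner computes the batching key for a request.
type Partitioner interface {
	GetKey(ctx context.Context, req Request) string
}

// defaultBatcher stands in for the existing single-partition batcher.
type defaultBatcher struct{ /* active batch, flush timer, ... */ }

// multiBatcher routes each request to the defaultBatcher (shard) that owns its key.
type multiBatcher struct {
	partitioner Partitioner
	mu          sync.Mutex
	shards      map[string]*defaultBatcher
}

func (mb *multiBatcher) getShard(ctx context.Context, req Request) *defaultBatcher {
	key := mb.partitioner.GetKey(ctx, req)
	mb.mu.Lock()
	defer mb.mu.Unlock()
	shard, ok := mb.shards[key]
	if !ok {
		shard = &defaultBatcher{}
		mb.shards[key] = shard
	}
	return shard
}
```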

Link to tracking issue

#12795

Testing

Documentation


codecov bot commented Mar 29, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 91.57%. Comparing base (8a190ed) to head (57c510d).

Additional details and impacted files
@@            Coverage Diff             @@
##             main   #12760      +/-   ##
==========================================
+ Coverage   91.55%   91.57%   +0.02%     
==========================================
  Files         499      500       +1     
  Lines       27102    27169      +67     
==========================================
+ Hits        24814    24881      +67     
  Misses       1809     1809              
  Partials      479      479              


@sfc-gh-sili force-pushed the sili-metadata-key-3 branch 6 times, most recently from 2834e5c to e501416, on April 1, 2025 05:22
@sfc-gh-sili marked this pull request as ready for review on April 1, 2025 05:23
@sfc-gh-sili requested a review from a team as a code owner on April 1, 2025 05:23
@sfc-gh-sili changed the title from "[exporter][batcher] Multi-batch support - Version 2" to "[in discussion][exporter][batcher] Multi-batch support - Version 2" on Apr 3, 2025
@sfc-gh-sili force-pushed the sili-metadata-key-3 branch 2 times, most recently from 46eca7f to 182507f, on April 4, 2025 20:47
```go
}

func newMultiBatcher(bCfg BatchConfig, bSet batcherSettings[request.Request]) *multiBatcher {
// TODO: Determine what is the right behavior for this in combination with async queue.
```
Contributor
What question(s) are there? I'm not sure I'm following.

Contributor Author

This comment is duplicated from newDefaultBatcher(). IIUC, this is related to how goroutines are allocated when the async queue and the batcher are used together. Right now:

  • AsyncQueue owns a goroutine pool of size n that is in charge of reading from the queue and calling Batcher::consume().
  • Batcher::consume() is in charge of appending the new item to the batch and invoking the flushing goroutine when needed.
  • Batcher can allocate up to m goroutines for dispatching the active batch.

Both n and m come from the same config field, sending_queue::num_consumers, but it does not necessarily make sense to use the same number of goroutines for "reading from the queue" and "dispatching the request" (rough sketch of the two pools below).
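To illustrate, a rough sketch of the two pools; this is not the collector's actual implementation, and everything except the sending_queue::num_consumers field mentioned above is an assumption made for the example:

```go
package sketch

import "context"

type batcher struct {
	flushSem chan struct{} // capacity m: bounds concurrent flush goroutines
}

func newBatcher(m int) *batcher {
	return &batcher{flushSem: make(chan struct{}, m)}
}

// consume appends the item to the active batch and, once the batch is ready,
// dispatches it on its own goroutine, bounded by flushSem.
func (b *batcher) consume(ctx context.Context, item any) {
	// ... append item to the active batch; when it is full (or a timer fires):
	b.flushSem <- struct{}{}
	go func() {
		defer func() { <-b.flushSem }()
		// ... export the batch
	}()
}

// startReaders launches the n goroutines owned by the async queue that pull
// items from the queue and hand them to the batcher.
func startReaders(ctx context.Context, n int, pull func(context.Context) (any, bool), b *batcher) {
	for i := 0; i < n; i++ {
		go func() {
			for {
				item, ok := pull(ctx)
				if !ok {
					return
				}
				b.consume(ctx, item)
			}
		}()
	}
}
```

Today both pools would effectively be sized from the same value, e.g. startReaders(ctx, numConsumers, queueRead, newBatcher(numConsumers)) (queueRead being a hypothetical reader function), even though the ideal n and m may differ.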

@sfc-gh-sili force-pushed the sili-metadata-key-3 branch 2 times, most recently from 2e966a1 to 04c0b89, on April 21, 2025 23:06
@sfc-gh-sili force-pushed the sili-metadata-key-3 branch from 04c0b89 to 4db99d2 on April 21, 2025 23:14
@sfc-gh-sili force-pushed the sili-metadata-key-3 branch from 0c09d43 to 57c510d on April 22, 2025 08:01
@sfc-gh-sili changed the title from "[in discussion][exporter][batcher] Multi-batch support - Version 2" to "[exporter][batcher] Multi-batch support - Version 2" on Apr 22, 2025
```go
}
}

func (qb *multiBatcher) getShard(ctx context.Context, req request.Request) *defaultBatcher {
```
Member
Do we need to have another batcher at all (multiBatcher)? I see a significant amount of code duplication. Maybe we just update defaultBatcher instead, so that getShard returns a singleton if partitioner == nil?
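A compact sketch of this suggestion, with illustrative types (Request, shard, and the field names are assumptions, not the existing code):

```go
package sketch

import (
	"context"
	"sync"
)

type Request interface{}

type Partitioner interface {
	GetKey(ctx context.Context, req Request) string
}

type shard struct{ /* active batch, flush timer, ... */ }

type batcher struct {
	partitioner Partitioner // nil when key-based batching is disabled
	single      *shard      // pre-allocated; used when partitioner == nil
	mu          sync.Mutex
	shards      map[string]*shard
}

func (qb *batcher) getShard(ctx context.Context, req Request) *shard {
	if qb.partitioner == nil {
		return qb.single // singleton shard: today's defaultBatcher behavior
	}
	key := qb.partitioner.GetKey(ctx, req)
	qb.mu.Lock()
	defer qb.mu.Unlock()
	s, ok := qb.shards[key]
	if !ok {
		s = &shard{}
		qb.shards[key] = s
	}
	return s
}
```

This would keep a single batcher type and make the non-keyed path just the degenerate single-shard case.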


3 participants