
User interface for @batched #2065

Merged · 33 commits · merged into main on Aug 14, 2024

Conversation

@cathyzbn (Contributor) commented on Aug 1, 2024

Describe your changes

Enable batching in modal functions and class methods.

Backward/forward compatibility checks

Check these boxes or delete any item (or this section) if not relevant for this PR.

  • Client+Server: this change is compatible with old servers
  • Client forward compatibility: this change ensures client can accept data intended for later versions of itself

Note on protobuf: protobuf message changes in one place may have an impact on
multiple entities (client, server, worker, database). See the points above.


Changelog

Added support for dynamic batching. Functions and class methods decorated with @modal.batched now automatically batch their invocations together, up to a specified max_batch_size. After the first invocation arrives, the batch waits at most wait_ms for additional invocations before executing. See the guide for more details.

@app.function()
@modal.batched(max_batch_size=4, wait_ms=1000)
async def batched_multiply(xs: list[int], ys: list[int]) -> list[int]:
    return [x * y for x, y in zip(xs, ys)]

@app.cls()
class BatchedClass:
    @modal.batched(max_batch_size=4, wait_ms=1000)
    async def batched_multiply(self, xs: list[int], ys: list[int]) -> list[int]:
        return [x * y for x, y in zip(xs, ys)]

The batched function is called with individual inputs:

await batched_multiply.remote.aio(2, 3)
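
As a usage sketch (not part of this PR), concurrent calls issued within the wait window should be grouped into a single batch. The asyncio.gather driver and the argument values below are illustrative assumptions:

import asyncio

async def main():
    # Illustrative: four concurrent single-input calls. Calls arriving within
    # wait_ms (1000 ms here) are assumed to be served as one batch of up to
    # max_batch_size=4, and each caller receives only its own result.
    results = await asyncio.gather(
        *(batched_multiply.remote.aio(i, i + 1) for i in range(4))
    )
    print(results)  # e.g. [0, 2, 6, 12]

asyncio.run(main())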

cathyzbn changed the title from "Cathy/batch user interface" to "User interface for @batched" on Aug 1, 2024
@mwaskom (Contributor) left a comment:
Just a drive-by review with a few comments. There's a lot going on in this PR! Could potentially be easier to discuss with more atomic changes, although I appreciate that this is a complex feature and that it's helpful to test the live code on your branch!

modal/partial_function.py (outdated; resolved)
Comment on lines 578 to 579
max_batch_size: int,
max_wait_ms: int,
Contributor:

Would it ever make sense to leave this unset? e.g. to say "I always want batches of size n"?

cathyzbn (Contributor Author):

In that case the function might block forever. Now that we have an upper limit of 10 minutes, would it be better to just let the user set it to 10 minutes?

Additional resolved review threads: test/cls_test.py, modal/_container_entrypoint.py (6 threads, outdated), modal/_container_io_manager.py (outdated)
cathyzbn marked this pull request as ready for review on August 8, 2024 at 19:39
cathyzbn requested review from gongy and mwaskom on August 12, 2024 at 16:19
@gongy (Contributor) left a comment:

Thanks for the thorough tests. This looks great!

cathyzbn merged commit 30cc1a8 into main on Aug 14, 2024 · 25 checks passed
cathyzbn deleted the cathy/batch_user_interface branch on August 14, 2024 at 16:04