
[FEATURE] Offline Batch Inference and Batch Ingestion #2840

Open
Zhangxunmt opened this issue Aug 20, 2024 · 0 comments
Assignees: Zhangxunmt
Labels: 2.17, enhancement (New feature or request), feature, Roadmap:Cost/Performance/Scale (Project-wide roadmap label), v2.17.0

Comments

@Zhangxunmt (Collaborator)

Zhangxunmt commented Aug 20, 2024

Is your feature request related to a problem?
This feature is related to #1840 and #2488. It covers ingesting batch inference results from files in S3, OpenAI, Cohere, etc., into the OpenSearch cluster. Batch inference was released in OpenSearch 2.16.

For more details and real-world examples of the whole workflow, please refer to RFC #2891.
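As a rough illustration of the batch inference half of this workflow (released in 2.16), a remote model can be invoked asynchronously through ML Commons' Batch Predict API. The request below is a sketch, not text from this issue: `<model_id>` is a placeholder for a deployed remote model, and the parameter names (`model`, `input_file_id`, `endpoint`) assume a connector wired to the OpenAI Batch API and may differ for other providers.

```json
POST /_plugins/_ml/models/<model_id>/_batch_predict
{
  "parameters": {
    "model": "text-embedding-ada-002",
    "input_file_id": "file-abc123",
    "endpoint": "/v1/embeddings"
  }
}
```

The job runs offline on the provider's side and writes its results to an output file (e.g., in S3 or the provider's file store); this issue proposes the follow-up step of ingesting that output file into an OpenSearch index.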

[Screenshot attached Sep 4, 2024: workflow diagram]
Zhangxunmt added the enhancement (New feature or request), untriaged, and feature labels and removed the untriaged label on Aug 20, 2024
Zhangxunmt self-assigned this on Aug 20, 2024
Zhangxunmt added the Roadmap:Cost/Performance/Scale (Project-wide roadmap label) and 2.17 labels on Aug 26, 2024
Zhangxunmt changed the title from "[FEATURE] Support Offline Batch Ingestion" to "[FEATURE] Offline Batch Inference and Batch Ingestion" on Sep 9, 2024