
# opensearch-neural-sparse-sample

Sample code to trace, deploy, and fine-tune neural sparse models for OpenSearch using PyTorch and the transformers API.

## What we have now

- code sample to deploy a neural sparse model on SageMaker that can be accessed via an OpenSearch connector
- code sample to create an index for neural sparse search, then ingest documents and run searches
  - combined with the chunking processor
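The index sample above can be sketched as the request bodies involved: an ingest pipeline that chunks text and sparse-encodes each chunk, an index mapping that stores the sparse vectors, and a `neural_sparse` query. This is a minimal sketch, not the repo's code; the pipeline, field, and model names (`chunks_embedding`, `my-sparse-model-id`, the `token_limit` value) are hypothetical placeholders, and the exact mapping details can vary by OpenSearch version.

```python
# Sketch of OpenSearch request bodies for neural sparse search with the text
# chunking processor. All names and parameter values here are illustrative.

def sparse_ingest_pipeline(model_id: str) -> dict:
    """Ingest pipeline: split long text into chunks, then sparse-encode each chunk."""
    return {
        "description": "chunk text, then encode it with a neural sparse model",
        "processors": [
            {
                "text_chunking": {
                    "algorithm": {"fixed_token_length": {"token_limit": 384}},
                    "field_map": {"text": "chunks"},
                }
            },
            {
                "sparse_encoding": {
                    "model_id": model_id,  # id of the deployed sparse encoding model
                    "field_map": {"chunks": "chunks_embedding"},
                }
            },
        ],
    }


def sparse_index_body(pipeline_id: str) -> dict:
    """Index body: rank_features stores the token-to-weight sparse vectors."""
    return {
        "settings": {"default_pipeline": pipeline_id},
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "chunks_embedding": {"type": "rank_features"},
            }
        },
    }


def neural_sparse_query(model_id: str, query_text: str) -> dict:
    """Search body: the model expands query_text into weighted tokens at query time."""
    return {
        "query": {
            "neural_sparse": {
                "chunks_embedding": {
                    "query_text": query_text,
                    "model_id": model_id,
                }
            }
        }
    }
```

With a client such as opensearch-py, these bodies would be passed to `ingest.put_pipeline`, `indices.create`, and `search` respectively.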

## Todos

- code sample to fine-tune a neural sparse model
- code sample to trace a neural model that can be uploaded to an OpenSearch cluster
- code sample to create an index for neural sparse search, then ingest documents and run searches
  - combined with a dense model to do hybrid search
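For the hybrid search todo, the rough shape is a search pipeline that normalizes and combines sub-query scores, plus a `hybrid` query wrapping a neural sparse sub-query and a dense `neural` sub-query. A minimal sketch, not the planned sample itself; the field names (`text_sparse`, `text_dense`), model ids, and weights are hypothetical placeholders.

```python
# Sketch of hybrid (sparse + dense) search request bodies for OpenSearch.
# Model ids, field names, and weights are illustrative placeholders.

def hybrid_search_pipeline(sparse_weight: float = 0.5) -> dict:
    """Search pipeline: min-max normalize sub-query scores, then weighted-average them."""
    return {
        "description": "normalize and combine sparse and dense scores",
        "phase_results_processors": [
            {
                "normalization-processor": {
                    "normalization": {"technique": "min_max"},
                    "combination": {
                        "technique": "arithmetic_mean",
                        "parameters": {"weights": [sparse_weight, 1.0 - sparse_weight]},
                    },
                }
            }
        ],
    }


def hybrid_query(query_text: str, sparse_model_id: str,
                 dense_model_id: str, k: int = 10) -> dict:
    """Hybrid query: one neural_sparse sub-query plus one dense neural sub-query."""
    return {
        "query": {
            "hybrid": {
                "queries": [
                    {
                        "neural_sparse": {
                            "text_sparse": {
                                "query_text": query_text,
                                "model_id": sparse_model_id,
                            }
                        }
                    },
                    {
                        "neural": {
                            "text_dense": {
                                "query_text": query_text,
                                "model_id": dense_model_id,
                                "k": k,  # number of nearest neighbors for the dense leg
                            }
                        }
                    },
                ]
            }
        }
    }
```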

## Code sample request

If you need sample code not covered in this repo, please open a new issue in this repo to request it.