Densifying Sparse Representations for Passage Retrieval by Representational Slicing
Paper: arXiv:2112.04666
How to use jacklin/DeLADE with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("jacklin/DeLADE", dtype="auto")
```
This model, DeLADE, is trained by fusing neural lexical and semantic components in a single transformer, using DistilBERT as the backbone. It is described in the paper "A Dense Representation Framework for Lexical and Semantic Matching" by Sheng-Chieh Lin and Jimmy Lin.
You can find usage instructions for the model in our DHR repo: (1) inference on MS MARCO passage ranking; (2) inference on the BEIR datasets.
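At retrieval time, dense models like DeLADE typically score a passage by the inner product between the query vector and each passage vector. A minimal sketch of that scoring step, using small hypothetical pre-computed vectors rather than real model outputs:

```python
import numpy as np

# Hypothetical pre-computed dense vectors; in practice these would be
# produced by encoding queries and passages with the DeLADE model.
query = np.array([0.2, 0.7, 0.1])
passages = np.array([
    [0.1, 0.8, 0.0],  # passage 0
    [0.9, 0.1, 0.3],  # passage 1
])

# Relevance score is the inner product of query and passage vectors.
scores = passages @ query

# Rank passages from highest to lowest score.
ranking = np.argsort(-scores)
print(ranking.tolist())  # passage indices in descending relevance
```

The same dot-product scoring extends to a full corpus by stacking all passage vectors into one matrix, which is what makes dense representations amenable to fast (approximate) nearest-neighbor search.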