This model is a fine-tuned version of gpt2-medium on an unknown dataset.

How to use diffusionfamily/diffugpt-m with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("diffusionfamily/diffugpt-m")
model = AutoModel.from_pretrained("diffusionfamily/diffugpt-m")
```
Further details and model-loading instructions are available at https://github.com/HKUNLP/DiffuLLaMA.
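As a quick sanity check after loading, you can run a plain forward pass and inspect the hidden states. This is only an illustrative sketch: it does not perform diffusion-based text generation, which requires the sampling code from the DiffuLLaMA repository linked above. The input sentence is arbitrary.

```python
# Sanity-check sketch: load the model and run a single forward pass.
# This inspects hidden states only; diffusion sampling is not shown here.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("diffusionfamily/diffugpt-m")
model = AutoModel.from_pretrained("diffusionfamily/diffugpt-m")

inputs = tokenizer("Diffusion language models", return_tensors="pt")
outputs = model(**inputs)

# With a GPT-2-medium backbone, the last dimension is the hidden size (1024)
print(outputs.last_hidden_state.shape)
```

The output tensor has shape `(batch, sequence_length, hidden_size)`; checking it confirms the checkpoint loaded correctly before moving on to the repository's generation scripts.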
Citation:

```bibtex
@misc{gong2024scalingdiffusionlanguagemodels,
  title={Scaling Diffusion Language Models via Adaptation from Autoregressive Models},
  author={Shansan Gong and Shivam Agarwal and Yizhe Zhang and Jiacheng Ye and Lin Zheng and Mukai Li and Chenxin An and Peilin Zhao and Wei Bi and Jiawei Han and Hao Peng and Lingpeng Kong},
  year={2024},
  eprint={2410.17891},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2410.17891},
}
```
Base model: openai-community/gpt2-medium