---
language: zh
license: apache-2.0
---

# Mengzi-T5 model (Chinese)

Pretrained on a 300 GB Chinese corpus.

[Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese](https://arxiv.org/abs/2110.06696)

## Usage

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("Langboat/mengzi-t5-base")
model = T5ForConditionalGeneration.from_pretrained("Langboat/mengzi-t5-base")
```
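
A minimal generation sketch, assuming the tokenizer and model loaded above; the prompt and decoding settings are illustrative only, and the base model is normally fine-tuned on a downstream task before it produces useful output:

```python
# Illustrative only: ask the pretrained model to fill in a sentinel span.
# Prompt means "The capital of China is located in <extra_id_0>."
inputs = tokenizer("中国的首都位于<extra_id_0>。", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```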

## Citation

If you find the technical report or this resource useful, please cite the following technical report in your paper.

```bibtex
@misc{zhang2021mengzi,
      title={Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese},
      author={Zhuosheng Zhang and Hanqing Zhang and Keming Chen and Yuhang Guo and Jingyun Hua and Yulong Wang and Ming Zhou},
      year={2021},
      eprint={2110.06696},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```