---
language:
- zh
license: apache-2.0
---
# Mengzi-T5 model (Chinese)
T5 model pretrained on a 300 GB Chinese corpus.
[Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese](https://arxiv.org/abs/2110.06696)
## Usage
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("Langboat/mengzi-t5-base")
model = T5ForConditionalGeneration.from_pretrained("Langboat/mengzi-t5-base")
```
## Citation
If you find the technical report or this resource useful, please cite the report in your paper:
```bibtex
@misc{zhang2021mengzi,
title={Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese},
author={Zhuosheng Zhang and Hanqing Zhang and Keming Chen and Yuhang Guo and Jingyun Hua and Yulong Wang and Ming Zhou},
year={2021},
eprint={2110.06696},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```